Hierarchical temporal memory

Hierarchical temporal memory (HTM) is an online machine learning model developed by Jeff Hawkins and Dileep George of Numenta, Inc. that models some of the structural and algorithmic properties of the neocortex. HTM is a biomimetic model based on the memory-prediction theory of brain function described by Jeff Hawkins in his book ''On Intelligence''. HTM is a method for discovering and inferring the high-level causes of observed input patterns and sequences, thus building an increasingly complex model of the world.
Jeff Hawkins states that HTM does not present any new idea or theory, but combines existing ideas to mimic the neocortex with a simple design that provides a large range of capabilities. HTM combines and extends approaches used in sparse distributed memory, Bayesian networks, and spatial and temporal clustering algorithms, while using a tree-shaped hierarchy of nodes that is common in neural networks.
== HTM structure and algorithms ==

A typical HTM network is a tree-shaped hierarchy of ''levels'' that are composed of smaller elements called ''nodes'' or ''columns''. A single level in the hierarchy is also called a ''region''. Higher levels of the hierarchy often have fewer nodes and therefore lower spatial resolution. Higher levels can reuse patterns learned at the lower levels by combining them to memorize more complex patterns.
Each HTM node has the same basic functionality. In learning and inference modes, sensory data comes into the bottom-level nodes. In generation mode, the bottom-level nodes output the generated pattern of a given category. The top level usually has a single node that stores the most general categories (concepts), which determine, or are determined by, smaller concepts in the lower levels that are more restricted in time and space. In inference mode, a node in each level interprets information coming in from its child nodes in the lower level as probabilities of the categories it has in memory.
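A minimal sketch may make this bottom-up flow concrete. The Python below is illustrative only: the class name Node, the field/learn/infer names, and the exact-match lookup are assumptions made for this example rather than Numenta's API. Two bottom-level nodes each watch part of the raw input, and a single top-level node interprets the categories they report as a more general concept.

# Illustrative sketch of the hierarchy described above (assumed names,
# not Numenta's API): bottom-level nodes see slices of the raw input,
# higher-level nodes see the categories their children report.

class Node:
    def __init__(self, children=(), field=slice(None)):
        self.children = list(children)  # nodes in the level below; empty for bottom-level nodes
        self.field = field              # which part of the raw input a bottom-level node receives
        self.memory = {}                # memorized pattern -> category name

    def learn(self, pattern, category):
        # Memorize that this pattern (raw bits or child categories) means `category`.
        self.memory[pattern] = category

    def infer(self, sensory_input):
        # Interpret what arrives from below as one of the memorized categories.
        if self.children:
            pattern = tuple(child.infer(sensory_input) for child in self.children)
        else:
            pattern = tuple(sensory_input[self.field])
        # A real HTM node outputs a probability distribution over its categories;
        # this toy version returns the best exact match or "unknown".
        return self.memory.get(pattern, "unknown")


# Usage: two bottom-level nodes each cover half of a 4-bit input,
# and the top-level node combines their outputs into a larger concept.
left = Node(field=slice(0, 2))
right = Node(field=slice(2, 4))
top = Node(children=[left, right])
left.learn((1, 0), "edge-left")
right.learn((0, 1), "edge-right")
top.learn(("edge-left", "edge-right"), "line")
print(top.infer([1, 0, 0, 1]))  # -> line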
Each HTM region learns by identifying and memorizing spatial patterns: combinations of input bits that often occur at the same time. It then identifies temporal sequences of spatial patterns that are likely to occur one after another.
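As an illustration of these two learning steps, the following toy sketch memorizes frequently co-occurring bit combinations and the transitions between them. The names SpatialMemory and TemporalMemory, the count threshold, and the first-order transition table are simplifying assumptions for this example, not the actual HTM learning algorithms.

# Toy sketch of the two learning steps above (assumed names, not the
# actual HTM algorithms): a SpatialMemory that keeps bit combinations
# seen often enough, and a TemporalMemory that records which pattern
# tends to follow which.
from collections import Counter, defaultdict


class SpatialMemory:
    def __init__(self, threshold=2):
        self.counts = Counter()
        self.threshold = threshold  # how often a combination must occur to count as a pattern

    def learn(self, input_bits):
        self.counts[tuple(input_bits)] += 1

    def patterns(self):
        # Spatial patterns: combinations of input bits that occurred together often.
        return {p for p, n in self.counts.items() if n >= self.threshold}


class TemporalMemory:
    def __init__(self):
        self.transitions = defaultdict(Counter)  # pattern -> counts of the patterns that followed it
        self.previous = None

    def learn(self, pattern):
        if self.previous is not None:
            self.transitions[self.previous][pattern] += 1
        self.previous = pattern

    def predict(self, pattern):
        # Return the pattern most likely to follow, if one has been seen.
        following = self.transitions.get(pattern)
        return following.most_common(1)[0][0] if following else None


# Usage: feed an alternating sequence, then ask what usually follows (1, 0, 1).
spatial, temporal = SpatialMemory(), TemporalMemory()
for bits in [(1, 0, 1), (0, 1, 1), (1, 0, 1), (0, 1, 1)]:
    spatial.learn(bits)
    temporal.learn(bits)
print(spatial.patterns())            # -> {(1, 0, 1), (0, 1, 1)}
print(temporal.predict((1, 0, 1)))   # -> (0, 1, 1)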

Excerpt source: the free encyclopedia Wikipedia
Read the full article on "Hierarchical temporal memory" at Wikipedia


